Stochastic Differential Equations: Lab 2


In [1]:
from IPython.core.display import HTML
css_file = 'https://raw.githubusercontent.com/ngcm/training-public/master/ipython_notebook_styles/ngcmstyle.css'
HTML(url=css_file)


Out[1]:

The background for these exercises is the article by D. Higham, An Algorithmic Introduction to Numerical Simulation of Stochastic Differential Equations, SIAM Review 43:525-546 (2001). Higham provides Matlab codes illustrating the basic ideas at http://personal.strath.ac.uk/d.j.higham/algfiles.html; these are also given in the paper.


In [2]:
%matplotlib inline
import numpy
from matplotlib import pyplot
from matplotlib import rcParams
rcParams['font.family'] = 'serif'
rcParams['font.size'] = 16
rcParams['figure.figsize'] = (12,6)
from scipy.integrate import quad

Further stochastic integrals

Quick recap: the key feature is the Ito stochastic integral

\begin{equation} \int_{t_0}^t G(t') \, \text{d}W(t') = \text{mean-square-}\lim_{n\to +\infty} \left\{ \sum_{i=1}^n G(t_{i-1}) (W_{t_i} - W_{t_{i-1}} ) \right\} \end{equation}

where the key point for the Ito integral is that the integrand $G$ is evaluated at the left end of each subinterval ($t_{i-1}$).

Now we use this to write down the SDE

\begin{equation} \text{d}X_t = f(X_t) \, \text{d}t + g(X_t) \, \text{d}W_t \end{equation}

with formal solution

\begin{equation} X_t = X_0 + \int_0^t f(X_s) \, \text{d}s + \int_0^t g(X_s) \, \text{d}W_s. \end{equation}

Using the Ito stochastic integral formula we get the Euler-Maruyama method

\begin{equation} X_{n+1} = X_n + \delta t \, f(X_n) + \sqrt{\delta t} \xi_n \, g(X_n) \end{equation}

by applying the formal solution over a single step $[t_n, t_{n+1} = t_n + \delta t]$ and evaluating both integrands at the left endpoint $t_n$, as the Ito integral requires. Here $\delta t$ is the width of the interval and $\xi_n \sim N(0, 1)$ is a standard normal random variable, so that $\sqrt{\delta t} \, \xi_n$ has the same distribution as the Brownian increment $W_{t_{n+1}} - W_{t_n}$.
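
As a concrete sketch, the update above translates directly into code. This is a minimal illustration, not part of the lab itself; the function name, the generic `f` and `g` arguments, and the fixed seed are our own choices.


In [ ]:
# Minimal Euler-Maruyama sketch for dX = f(X) dt + g(X) dW.
# Relies on the numpy import above; names and defaults are illustrative.

def euler_maruyama(f, g, X0, T, N, seed=42):
    """Integrate dX = f(X) dt + g(X) dW on [0, T] with N steps."""
    rng = numpy.random.default_rng(seed)
    dt = T / N
    X = numpy.zeros(N + 1)
    X[0] = X0
    for n in range(N):
        xi = rng.standard_normal()   # xi_n ~ N(0, 1)
        X[n + 1] = X[n] + dt * f(X[n]) + numpy.sqrt(dt) * xi * g(X[n])
    return X
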

Normal chain rule

If

\begin{equation} \frac{\text{d}X}{\text{d}t} = f(X_t) \end{equation}

and we want to find the differential equation satisfied by $h(X(t))$ (or $h(X_t)$), then we write

\begin{align} \text{d} h(X_t) &= h \left( X(t) + \text{d}X(t) \right) - h(X(t)) \\ &\simeq h(X(t)) + \text{d}X \, h'(X(t)) + \frac{1}{2} (\text{d}X)^2 \, h''(X(t)) + \dots - h(X(t)) \\ &\simeq f(X) h'(X) \, \text{d}t + \frac{1}{2} (f(X))^2 h''(X) \, (\text{d}t)^2 + \dots \\ \implies \frac{\text{d} h(X)}{\text{d}t} &= f(X) h'(X), \end{align}

since the $(\text{d}t)^2$ term vanishes in the limit $\text{d}t \to 0$.

Stochastic chain rule

Now run through the same steps using the equation

\begin{equation} \text{d}X = f(X)\, \text{d}t + g(X) \, \text{d}W. \end{equation}

We find

\begin{align} \text{d}h &\simeq h'(X(t))\, \text{d}X + \frac{1}{2} h''(X(t)) (\text{d}X)^2 + \dots \\ &\simeq h'(X) f(X)\, \text{d}t + h'(X) g(X) \, \text{d}W + \frac{1}{2} h''(X) \left( f^2(X)\, \text{d}t^2 + 2 f(X) g(X)\, \text{d}t \, \text{d}W + g^2(X)\, \text{d}W^2 \right) \\ \implies \text{d}h &= \left( f(X) h'(X) + \frac{1}{2} h''(X) g^2(X) \right) \, \text{d}t + h'(X) g(X) \, \text{d}W, \end{align}

where in the last step we used $\text{d}W^2 \to \text{d}t$ (in mean square) and discarded the higher order $\text{d}t^2$ and $\text{d}t \, \text{d}W$ terms.

This additional $g^2$ term makes all the difference when deriving numerical methods, where the chain rule is repeatedly used.
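
As an example that we will use below: applying this stochastic chain rule to the linear SDE $\text{d}X = \lambda X \, \text{d}t + \mu X \, \text{d}W$ with $h(X) = \ln X$, so that $h'(X) = 1/X$ and $h''(X) = -1/X^2$, gives

\begin{equation} \text{d} \ln X_t = \left( \lambda - \tfrac{1}{2} \mu^2 \right) \text{d}t + \mu \, \text{d}W_t, \end{equation}

which integrates to $X_t = X_0 \exp \left[ \left( \lambda - \tfrac{1}{2} \mu^2 \right) t + \mu W_t \right]$: the exact solution quoted in the tasks below.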

Using this result

Remember that

\begin{equation} \int_{t_0}^t W_s \, \text{d}W_s = \frac{1}{2} W^2_t - \frac{1}{2} W^2_{t_0} - \frac{1}{2} (t - t_0). \end{equation}

From this we need to identify the stochastic differential equation, and also the function $h$, that will give us this result just from the chain rule.

The SDE is

\begin{equation} \text{d}X_t = \text{d}W_t, \quad f(X) = 0, \quad g(X) = 1. \end{equation}

Writing the chain rule down in the form

\begin{equation} h(X_t) = h(X_{t_0}) + \int_{t_0}^t \left( f(X_s) h'(X_s) + \frac{1}{2} h''(X_s) g^2(X_s) \right) \, \text{d}s + \int_{t_0}^t h'(X_s) g(X_s) \, \text{d}W_s. \end{equation}

Matching the final term (the integral over $\text{d}W_s$) we see that we need $h'$ to go like $X$, or

\begin{equation} h = X^2, \quad \text{d}X_t = \text{d}W_t, \quad f(X) = 0, \quad g(X) = 1. \end{equation}

With $X_t = W_t$ we therefore have

\begin{align} W_t^2 &= W_{t_0}^2 + \int_{t_0}^t \frac{1}{2} \cdot 2 \, \text{d}s + \int_{t_0}^t 2 W_s \, \text{d}W_s \\ &= W_{t_0}^2 + (t - t_0) + \int_{t_0}^t 2 W_s \, \text{d}W_s, \end{align}

as required.
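
As a quick numerical sanity check of this identity (the resolution and seed below are arbitrary choices), we can evaluate the left-endpoint Ito sum directly and compare it with $\tfrac{1}{2} W_T^2 - \tfrac{1}{2} T$, taking $t_0 = 0$ and $W_0 = 0$. The two numbers should agree up to a sampling error that shrinks like $\sqrt{\delta t}$.


In [ ]:
# Left-endpoint (Ito) sum of W dW versus (W_T^2 - T) / 2.
# Resolution and seed are arbitrary illustrative choices.
rng = numpy.random.default_rng(42)
T, N = 1.0, 100000
dt = T / N
dW = numpy.sqrt(dt) * rng.standard_normal(N)
W = numpy.concatenate(([0.0], numpy.cumsum(dW)))   # W at the grid points
ito_sum = numpy.sum(W[:-1] * dW)                   # integrand at left endpoints
print(ito_sum, 0.5 * W[-1]**2 - 0.5 * T)
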

Milstein's method

Using our chain rule we can construct higher order methods for stochastic differential equations. Milstein's method, applied to the SDE

\begin{equation} \text{d}X = f(X) \, \text{d}t + g(X) \, \text{d}W, \end{equation}

is

\begin{equation} X_{n+1} = X_n + \delta t \, f_n + g_n \, \Delta W_n + \tfrac{1}{2} g_n g'_n \left( \Delta W_n^2 - \delta t \right), \end{equation}

where $f_n = f(X_n)$, $g_n = g(X_n)$, $g'_n = g'(X_n)$, and $\Delta W_n = W_{t_{n+1}} - W_{t_n} = \sqrt{\delta t} \, \xi_n$ is the Brownian increment over the step.

Tasks

Implement Milstein's method, applied to the problem in the previous lab:

\begin{equation} \text{d}X(t) = \lambda X(t) \, \text{d}t + \mu X(t) \, \text{d}W(t), \qquad X(0) = X_0. \end{equation}

Choose any reasonable values of the free parameters $\lambda, \mu, X_0$.

The exact solution to this equation is $X(t) = X(0) \exp \left[ \left( \lambda - \tfrac{1}{2} \mu^2 \right) t + \mu W(t) \right]$. Fix the timestep and compare your solution to the exact solution.


In [ ]:
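# One possible sketch, not a unique solution: Milstein's method for
# dX = lambda X dt + mu X dW, compared with the exact solution at a
# fixed timestep. The parameter values and seed are free choices.
lam, mu, X0 = 2.0, 1.0, 1.0
T, N = 1.0, 2**9
dt = T / N
rng = numpy.random.default_rng(42)
dW = numpy.sqrt(dt) * rng.standard_normal(N)
W = numpy.concatenate(([0.0], numpy.cumsum(dW)))   # Brownian path on the grid
t = numpy.linspace(0.0, T, N + 1)

X = numpy.zeros(N + 1)
X[0] = X0
for n in range(N):
    gn = mu * X[n]                                 # g(X_n); here g'(X_n) = mu
    X[n + 1] = (X[n] + dt * lam * X[n] + gn * dW[n]
                + 0.5 * gn * mu * (dW[n]**2 - dt))

X_exact = X0 * numpy.exp((lam - 0.5 * mu**2) * t + mu * W)

pyplot.plot(t, X, 'bx-', label="Milstein")
pyplot.plot(t, X_exact, 'k-', label="Exact")
pyplot.xlabel(r"$t$")
pyplot.ylabel(r"$X$")
pyplot.legend();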

Check the convergence again.


In [ ]:
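# Sketch of a strong convergence check, following the approach from the
# previous lab: measure E|X_N - X(T)| at t = T over many paths, for a
# sequence of timesteps obtained by coarsening one fine Brownian path per
# realisation. Parameters and resolutions are illustrative choices.
lam, mu, X0, T = 2.0, 1.0, 1.0, 1.0
rng = numpy.random.default_rng(42)
M = 1000                                  # number of Brownian paths
N_fine = 2**11                            # finest resolution
dt_fine = T / N_fine
dW_fine = numpy.sqrt(dt_fine) * rng.standard_normal((M, N_fine))
X_T_exact = X0 * numpy.exp((lam - 0.5 * mu**2) * T
                           + mu * dW_fine.sum(axis=1))

dts, errors = [], []
for level in range(5):
    stride = 2**level
    dt = stride * dt_fine
    # Coarse Brownian increments: sum consecutive fine increments.
    dW = dW_fine.reshape(M, N_fine // stride, stride).sum(axis=2)
    X = X0 * numpy.ones(M)
    for n in range(N_fine // stride):     # Milstein step, vectorized over paths
        gn = mu * X
        X = X + dt * lam * X + gn * dW[:, n] + 0.5 * gn * mu * (dW[:, n]**2 - dt)
    dts.append(dt)
    errors.append(numpy.mean(numpy.abs(X - X_T_exact)))

pyplot.loglog(dts, errors, 'kx-', label="Milstein")
pyplot.loglog(dts, numpy.array(dts) * errors[-1] / dts[-1], 'b--',
              label=r"$\propto \delta t$ (strong order 1)")
pyplot.xlabel(r"$\delta t$")
pyplot.ylabel(r"$E|X_N - X(T)|$")
pyplot.legend();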

Compare the performance of the Euler-Maruyama and Milstein methods using, e.g., %timeit. At what point is one method better than the other?


In [ ]:
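# Rough timing sketch with the %timeit magic; the setup is an illustrative
# choice. Per step, Milstein costs a little more than Euler-Maruyama, but for
# a target accuracy eps, EM (strong order 1/2) needs O(eps**-2) steps while
# Milstein (strong order 1) needs O(eps**-1), so Milstein wins once moderate
# accuracy is required.
lam, mu, X0, T = 2.0, 1.0, 1.0, 1.0

def em_path(dW, dt):
    X = X0
    for dWn in dW:
        X = X + dt * lam * X + mu * X * dWn
    return X

def milstein_path(dW, dt):
    X = X0
    for dWn in dW:
        gn = mu * X
        X = X + dt * lam * X + gn * dWn + 0.5 * gn * mu * (dWn**2 - dt)
    return X

N = 2**10
dt = T / N
dW = numpy.sqrt(dt) * numpy.random.default_rng(42).standard_normal(N)

%timeit em_path(dW, dt)
%timeit milstein_path(dW, dt)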

Population problem

Apply the algorithms, convergence and performance tests to the SDE

\begin{equation} \text{d}X(t) = r X(t) (K - X(t)) \, \text{d}t + \beta X(t) \, \text{d}W(t), \qquad X(0) = X_0. \end{equation}

Use the parameters $r = 2, K = 1, \beta = 0.25, X_0 = 0.5$.


In [3]:
r = 2.0
K = 1.0
beta = 0.25
X0 = 0.5
T   = 1.0

In [ ]:
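# Sketch: Euler-Maruyama and Milstein on the population SDE, reusing the
# parameters defined above. Here f(X) = r X (K - X) and g(X) = beta X, so
# g'(X) = beta. Resolution and seed are illustrative choices.
N = 2**9
dt = T / N
rng = numpy.random.default_rng(42)
dW = numpy.sqrt(dt) * rng.standard_normal(N)
t = numpy.linspace(0.0, T, N + 1)

X_em = numpy.zeros(N + 1)
X_mil = numpy.zeros(N + 1)
X_em[0] = X_mil[0] = X0
for n in range(N):
    X_em[n + 1] = (X_em[n] + dt * r * X_em[n] * (K - X_em[n])
                   + beta * X_em[n] * dW[n])
    gn = beta * X_mil[n]
    X_mil[n + 1] = (X_mil[n] + dt * r * X_mil[n] * (K - X_mil[n])
                    + gn * dW[n] + 0.5 * gn * beta * (dW[n]**2 - dt))

pyplot.plot(t, X_em, 'b-', label="Euler-Maruyama")
pyplot.plot(t, X_mil, 'g--', label="Milstein")
pyplot.xlabel(r"$t$")
pyplot.ylabel(r"$X$")
pyplot.legend();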

Investigate how the behaviour varies as you change the parameters $r, K, \beta$.


In [ ]:
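# Sketch of a parameter study: sweep one parameter at a time over a shared
# Brownian path. The values swept below are illustrative choices. Broadly,
# r sets how quickly paths relax towards the carrying capacity K, while
# beta sets the noise strength; for large beta the noise can overwhelm the
# deterministic logistic behaviour.
N = 2**9
dt = T / N
rng = numpy.random.default_rng(42)
dW = numpy.sqrt(dt) * rng.standard_normal(N)
t = numpy.linspace(0.0, T, N + 1)

def em_population(r, K, beta, X0):
    """Euler-Maruyama path for dX = r X (K - X) dt + beta X dW."""
    X = numpy.zeros(N + 1)
    X[0] = X0
    for n in range(N):
        X[n + 1] = X[n] + dt * r * X[n] * (K - X[n]) + beta * X[n] * dW[n]
    return X

for beta_try in [0.1, 0.25, 0.5, 1.0]:
    pyplot.plot(t, em_population(2.0, 1.0, beta_try, 0.5),
                label=r"$\beta = {}$".format(beta_try))
pyplot.xlabel(r"$t$")
pyplot.ylabel(r"$X$")
pyplot.legend();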